Regularized system identification using orthonormal basis functions
Most existing results on regularized system identification focus on
regularized impulse response estimation. Since the impulse response model is a
special case of a model built from orthonormal basis functions, it is natural
to ask whether regularized system identification can be tackled with more
compact orthonormal basis functions. In this paper, we explore two
possibilities. First, we construct a reproducing kernel Hilbert space of
impulse responses from orthonormal basis functions and then use the induced
reproducing kernel for regularized impulse response estimation. Second, we extend the
regularization method from impulse response estimation to the estimation of
more general orthonormal basis function models. For both cases, the poles of
the basis functions are treated as hyperparameters and estimated by the
empirical Bayes method. Then we further show that the former is a special case
of the latter; more specifically, the former is equivalent to ridge regression
of the coefficients of the orthonormal basis functions.
Comment: 6 pages, final submission of a contribution to the European Control Conference 2015, uploaded on March 20, 201
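The stated equivalence can be illustrated numerically. The sketch below is a minimal example, not the paper's construction: it assumes a hypothetical pole `a` and a polynomial-times-exponential basis orthonormalized by QR, then estimates an impulse response by ridge regression of the basis-function coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)

# True impulse response (length n); chosen to lie in the span of the basis below.
n = 50
t = np.arange(n)
g_true = 0.8**t * (1 + 0.5 * t)

# Orthonormal basis for impulse responses: decaying exponentials with pole a
# (a hyperparameter in the paper's setting) times polynomials, orthonormalized
# by QR so that B.T @ B = I_m.
a, m = 0.8, 8
B_raw = np.column_stack([a**t * t**k for k in range(m)])
B, _ = np.linalg.qr(B_raw)

# Simulated input-output data: y = Phi @ g_true + noise, with Phi the Toeplitz
# regression matrix built from a white-noise input u.
N = 200
u = rng.standard_normal(N)
Phi = np.column_stack([np.concatenate([np.zeros(k), u[:N - k]]) for k in range(n)])
y = Phi @ g_true + 0.1 * rng.standard_normal(N)

# Ridge regression of the coefficients c in g = B @ c:
lam = 1.0                                   # regularization weight (assumed fixed)
A = Phi @ B                                 # regressors for the coefficients
c_hat = np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ y)
g_hat = B @ c_hat                           # regularized impulse response estimate
print(np.linalg.norm(g_hat - g_true) / np.linalg.norm(g_true))
```

Under this setup, the induced kernel for the equivalent regularized impulse response estimate would be proportional to `B @ B.T`, matching the ridge penalty on the coefficients restricted to the span of the basis.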
Maximum Entropy Kernels for System Identification
A new nonparametric approach to system identification has recently been
proposed in which the impulse response is modeled as the realization of a
zero-mean Gaussian process whose covariance (kernel) has to be estimated from
data. In this scheme, the quality of the estimates crucially depends on the
parametrization of the covariance of the Gaussian process. A family of kernels
that have been shown to be particularly effective in the system identification
framework is the family of Diagonal/Correlated (DC) kernels. Maximum entropy
properties of a related family of kernels, the Tuned/Correlated (TC) kernels,
have recently been pointed out in the literature. In this paper we show that
maximum entropy properties indeed extend to the whole family of DC kernels. The
maximum entropy interpretation can be exploited in conjunction with results on
matrix completion problems in the graphical models literature to shed light on
the structure of the DC kernel. In particular, we prove that the DC kernel
admits a closed-form factorization, inverse and determinant. These results can
be exploited both to improve the numerical stability and to reduce the
computational complexity associated with the computation of the DC estimator.
Comment: Extends results of 2014 IEEE MSC Conference Proceedings (arXiv:1406.5706).
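The closed-form structure mentioned above can be checked numerically. The sketch below uses hypothetical hyperparameter values and the standard AR(1) identities (the paper's exact expressions may be parametrized differently): it builds a DC kernel, verifies its diagonal-times-AR(1) factorization, its closed-form determinant, its tridiagonal inverse, and the fact that the TC kernel is the special case rho = sqrt(lam).

```python
import numpy as np

n, c, lam, rho = 6, 1.0, 0.8, 0.6          # hypothetical hyperparameter values
i = np.arange(1, n + 1)

# DC kernel: K[i, j] = c * lam**((i + j) / 2) * rho**|i - j|
K = c * lam**((i[:, None] + i[None, :]) / 2) * rho**np.abs(i[:, None] - i[None, :])

# Factorization K = D R D: D diagonal, R the AR(1) correlation matrix rho**|i-j|
D = np.diag(np.sqrt(c) * lam**(i / 2))
R = rho**np.abs(i[:, None] - i[None, :])
print(np.allclose(K, D @ R @ D))                        # True

# Closed-form determinant, using det(R) = (1 - rho**2)**(n - 1) for AR(1)
det_closed = c**n * lam**i.sum() * (1 - rho**2)**(n - 1)
print(np.isclose(np.linalg.det(K), det_closed))         # True

# The inverse of an AR(1) covariance is tridiagonal, hence so is K^{-1}
Kinv = np.linalg.inv(K)
print(np.allclose(np.triu(np.abs(Kinv), 2), 0))         # True

# TC kernel as the special case rho = sqrt(lam): K_TC[i, j] = c * lam**max(i, j)
K_tc = c * lam**np.maximum(i[:, None], i[None, :])
K_dc = c * lam**((i[:, None] + i[None, :]) / 2) \
         * np.sqrt(lam)**np.abs(i[:, None] - i[None, :])
print(np.allclose(K_tc, K_dc))                          # True
```

The tridiagonal inverse is what enables the improved numerical stability and reduced computational cost mentioned in the abstract, since K^{-1} can be applied in O(n) operations.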
Regularized linear system identification using atomic, nuclear and kernel-based norms: the role of the stability constraint
Inspired by ideas from the machine learning literature, new
regularization techniques have recently been introduced in linear system
identification. In particular, all the adopted estimators solve a regularized
least squares problem, differing in the nature of the penalty term assigned to
the impulse response. Popular choices include atomic and nuclear norms (applied
to Hankel matrices) as well as norms induced by the so-called stable spline
kernels. In this paper, a comparative study of estimators based on these
different types of regularizers is reported. Our findings reveal that stable
spline kernels outperform approaches based on atomic and nuclear norms since
they suitably embed information on impulse response stability and smoothness.
This point is illustrated using the Bayesian interpretation of regularization.
We also design a new class of regularizers defined by "integral" versions of
stable spline/TC kernels. Under quite realistic experimental conditions, the
new estimators outperform classical prediction error methods even when the
latter are equipped with an oracle for model order selection.
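As a concrete reference point for the kernel-based estimators compared above, the sketch below solves the regularized least squares problem with a first-order stable spline / TC kernel in its well-known closed form. The hyperparameters are fixed at assumed values here rather than estimated by marginal likelihood.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data: y = Phi @ g0 + e, with Phi the Toeplitz regression matrix of the input
n, N, sigma = 30, 150, 0.1
t = np.arange(1, n + 1)
g0 = 0.85**t                                   # true (stable) impulse response
u = rng.standard_normal(N)
Phi = np.column_stack([np.concatenate([np.zeros(k), u[:N - k]]) for k in range(n)])
y = Phi @ g0 + sigma * rng.standard_normal(N)

# TC (first-order stable spline) kernel: K[i, j] = lam**max(i, j)
lam = 0.9                                      # hyperparameter (assumed fixed)
K = lam**np.maximum(t[:, None], t[None, :])

# Regularized LS: g_hat = argmin ||y - Phi g||^2 + sigma^2 * g' K^{-1} g,
# solved in the dual form g_hat = K Phi' (Phi K Phi' + sigma^2 I)^{-1} y
g_hat = K @ Phi.T @ np.linalg.solve(Phi @ K @ Phi.T + sigma**2 * np.eye(N), y)

print(np.linalg.norm(g_hat - g0) / np.linalg.norm(g0))
```

In the full schemes discussed above, `lam` and `sigma` would instead be tuned by maximizing the marginal likelihood (empirical Bayes), exploiting the Bayesian interpretation of the regularizer.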